26 research outputs found

    On Sparse Coding as an Alternate Transform in Video Coding

    In video compression, specifically in the prediction process, a residual signal is calculated by subtracting the predicted signal from the original signal; this residual represents the error of the prediction. The residual is usually transformed by a discrete cosine transform (DCT) from the pixel domain into the frequency domain. It is then quantized, which suppresses high frequencies to a greater or lesser degree depending on a quality parameter. The quantized signal is then entropy encoded, usually by a context-adaptive binary arithmetic coding (CABAC) engine, and written into a bitstream. In the decoding phase the process is reversed. DCT and quantization in combination are efficient tools, but they do not perform well at lower bitrates and introduce distortion and other artifacts. The proposed method uses sparse coding as an alternate transform, which compresses well at lower bitrates but not at high bitrates. The decision which transform is used is based on a rate-distortion optimization (RDO) cost calculation, so that each transform operates in its optimal performance range. The proposed method is implemented in the High Efficiency Video Coding (HEVC) test model HM-16.18 and in the HEVC screen content coding extension (HEVC-SCC) test model HM-16.18+SCM-8.7, and achieves Bjontegaard rate difference (BD-rate) savings of up to 5.5% compared to the standard.
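
    The RDO decision described above weighs distortion against bitrate for each candidate transform and keeps the cheaper one. The sketch below is a minimal Python illustration of that selection step, not the HM-16.18 implementation: the Lagrangian cost J = D + lambda * R, the crude bit-count estimates, and the greedy matching-pursuit-style sparse approximation are assumptions standing in for the actual HEVC tools and the paper's sparse-coding transform.

    import numpy as np
    from scipy.fft import dctn, idctn

    def rd_cost(distortion, rate_bits, lam):
        # Lagrangian rate-distortion cost J = D + lambda * R
        return distortion + lam * rate_bits

    def encode_residual_dct(residual, qstep):
        # DCT path: transform to the frequency domain, quantize, reconstruct.
        coeffs = dctn(residual, norm='ortho')
        q = np.round(coeffs / qstep)
        recon = idctn(q * qstep, norm='ortho')
        distortion = np.sum((residual - recon) ** 2)      # sum of squared errors
        rate_bits = np.count_nonzero(q) * 6               # crude rate proxy (bits)
        return distortion, rate_bits

    def encode_residual_sparse(residual, dictionary, max_atoms, qstep):
        # Sparse-coding path: greedy matching-pursuit-style approximation.
        # 'dictionary' is assumed to hold unit-norm atoms as rows, each of
        # length residual.size.
        signal = residual.ravel().astype(float)
        recon = np.zeros_like(signal)
        rate_bits = 0
        index_bits = int(np.ceil(np.log2(len(dictionary))))
        for _ in range(max_atoms):
            corr = dictionary @ (signal - recon)
            k = int(np.argmax(np.abs(corr)))
            coef = np.round(corr[k] / qstep) * qstep      # quantized coefficient
            if coef == 0:
                break
            recon += coef * dictionary[k]
            rate_bits += 6 + index_bits                   # coefficient + atom index
        distortion = np.sum((signal - recon) ** 2)
        return distortion, rate_bits

    def choose_transform(residual, dictionary, lam, qstep=8.0, max_atoms=4):
        # RDO decision: keep whichever transform has the lower Lagrangian cost.
        d_dct, r_dct = encode_residual_dct(residual, qstep)
        d_sc, r_sc = encode_residual_sparse(residual, dictionary, max_atoms, qstep)
        return 'dct' if rd_cost(d_dct, r_dct, lam) <= rd_cost(d_sc, r_sc, lam) else 'sparse'

    # Illustrative use: an 8x8 residual block and a random unit-norm dictionary.
    rng = np.random.default_rng(0)
    block = rng.integers(-32, 32, size=(8, 8)).astype(float)
    atoms = rng.standard_normal((128, 64))
    atoms /= np.linalg.norm(atoms, axis=1, keepdims=True)
    print(choose_transform(block, atoms, lam=100.0))

    In this sketch the sparse-coding branch tends to win when coarse quantization (low bitrate) lets a handful of dictionary atoms describe the residual more cheaply than many small DCT coefficients, which is the regime where sparse coding is reported to compress well.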

    Anatomy of STEM Teaching in American Universities: A Snapshot from a Large-Scale Observation Study

    National and local initiatives focused on the transformation of STEM teaching in higher education have multiplied over the last decade. These initiatives often focus on measuring change in instructional practices, but it is difficult to monitor such change without a national picture of STEM educational practices, especially as characterized by common observational instruments. We characterized a snapshot of this landscape by conducting the first large-scale observation-based study. We found that lecturing was prominent throughout the undergraduate STEM curriculum, even in classrooms with infrastructure designed to support active learning, indicating that further work is required to reform STEM education. Additionally, we established that STEM faculty's instructional practices can vary substantially within a course, invalidating the commonly used teaching evaluations based on a one-time observation.

    Quantifying the heterogeneity of macromolecular machines by mass photometry

    Sample purity is central to in vitro studies of protein function and regulation, and to the efficiency and success of structural studies using techniques such as X-ray crystallography and cryo-electron microscopy (cryo-EM). Here, we show that mass photometry (MP) can accurately characterize the heterogeneity of a sample using minimal material with high resolution within a matter of minutes. To benchmark our approach, we use negative stain electron microscopy (nsEM), a popular method for EM sample screening. We include typical workflows developed for structure determination that involve multi-step purification of a multi-subunit ubiquitin ligase and chemical cross-linking steps. When assessing the integrity and stability of large molecular complexes such as the proteasome, we detect and quantify assemblies invisible to nsEM. Our results illustrate the unique advantages of MP over current methods for rapid sample characterization, prioritization and workflow optimization.

    First operation of the KATRIN experiment with tritium

    The determination of the neutrino mass is one of the major challenges in astroparticle physics today. Direct neutrino mass experiments, based solely on the kinematics of β-decay, provide a largely model-independent probe of the neutrino mass scale. The Karlsruhe Tritium Neutrino (KATRIN) experiment is designed to directly measure the effective electron antineutrino mass with a sensitivity of 0.2 eV (90% CL). In this work we report on the first operation of KATRIN with tritium, which took place in 2018. During this commissioning phase of the tritium circulation system, excellent agreement of the theoretical prediction with the recorded spectra was found and stable conditions over a time period of 13 days could be established. These results are an essential prerequisite for the subsequent neutrino mass measurements with KATRIN in 2019.